Mathematical modeling tendencies in plant pathology
Plant diseases represent one of the major threats to crops around the world, carrying health, economic, environmental and social consequences. A description of the dynamics of plant disease is therefore necessary in order to devise sustainable strategies to prevent and diminish the impact of diseases on crops. Mathematical tools have been employed to create models that describe epidemic dynamics; the most commonly used are disease-progress curves, Linked Differential Equations (LDE), the Area Under the Disease Progress Curve (AUDPC) and computer simulation. Other tools have also been employed in plant disease epidemiology, such as statistical tools, visual evaluations and pictorial assessment. Each tool has its own advantages and disadvantages; the nature of the problem and the epidemiologist's needs determine the mathematical tool to be used and the variables to be included in the model. This paper presents a review of the tools used in plant disease epidemiology, remarking on their advantages and disadvantages, and of mathematical modeling tendencies in plant pathology.
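Of the tools listed, the AUDPC is the most directly computable: it is conventionally obtained by applying the trapezoidal rule to successive disease assessments. A minimal Python sketch (the assessment times and severity values below are illustrative, not from the paper):

```python
import numpy as np

def audpc(times, severities):
    """Area Under the Disease Progress Curve via the trapezoidal rule.

    times      -- assessment times (e.g. days after planting), ascending
    severities -- disease severity at each time (e.g. % leaf area affected)
    """
    t = np.asarray(times, dtype=float)
    y = np.asarray(severities, dtype=float)
    # Sum of trapezoids between consecutive assessments:
    # AUDPC = sum_i (y_i + y_{i+1}) / 2 * (t_{i+1} - t_i)
    return float(np.sum((y[:-1] + y[1:]) / 2.0 * np.diff(t)))

# Four weekly assessments of a hypothetical epidemic
result = audpc([0, 7, 14, 21], [0.0, 5.0, 20.0, 50.0])
```

The units of the result are severity multiplied by time (here, percent-days), which is why AUDPC values are only comparable across treatments assessed over the same period.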
SQG-Differential Evolution for difficult optimization problems under a tight function evaluation budget
In the context of industrial engineering, it is important to integrate
efficient computational optimization methods in the product development
process. Some of the most challenging simulation-based engineering design
optimization problems are characterized by: a large number of design variables,
the absence of analytical gradients, highly non-linear objectives and a limited
function evaluation budget. Although a huge variety of optimization algorithms
is available, the development and selection of efficient algorithms for
problems with these industrially relevant characteristics remains a challenge.
In this communication, a hybrid variant of Differential Evolution
(DE) is introduced which combines aspects of Stochastic Quasi-Gradient (SQG)
methods within the framework of DE, in order to improve optimization efficiency
on problems with the previously mentioned characteristics. The performance of
the resulting derivative-free algorithm is compared with other state-of-the-art
DE variants on 25 commonly used benchmark functions, under tight function
evaluation budget constraints of 1000 evaluations. The experimental results
indicate that the new algorithm performs excellently on the 'difficult'
(high-dimensional, multi-modal, inseparable) test functions. The operations
used in the proposed mutation scheme are computationally inexpensive and can be
easily implemented in existing Differential Evolution variants or other
population-based optimization algorithms in a few lines of program code, as a
non-invasive optional setting. Besides the applicability of the presented
algorithm by itself, the described concepts can serve as a useful and
interesting addition to the algorithmic operators in the frameworks of
heuristics and evolutionary optimization and computing.
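The abstract does not spell out the hybrid mutation scheme, so the following is an illustrative sketch only: classic DE/rand/1 mutation alongside a generic stochastic quasi-gradient step that estimates a directional derivative by central finite differences. All names, step sizes and the toy objective are assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def de_rand_1_mutation(pop, F=0.8):
    """Classic DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3).

    pop -- (NP, D) population matrix; returns one mutant per individual.
    """
    NP = len(pop)
    mutants = np.empty_like(pop)
    for i in range(NP):
        # Three distinct indices, all different from i
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        mutants[i] = pop[r1] + F * (pop[r2] - pop[r3])
    return mutants

def sqg_step(f, x, h=1e-2, gamma=0.1):
    """One stochastic quasi-gradient step (minimization): estimate the
    derivative of f along a random unit direction by central finite
    differences, then move downhill along that direction."""
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)
    g = (f(x + h * u) - f(x - h * u)) / (2 * h)  # directional derivative estimate
    return x - gamma * g * u

# Toy usage on the sphere function
sphere = lambda x: float(np.sum(x ** 2))
pop = rng.standard_normal((8, 5))
mutants = de_rand_1_mutation(pop)
x_new = sqg_step(sphere, pop[0])
```

The appeal of such a hybrid is that the quasi-gradient step needs only function evaluations, so it preserves the derivative-free character of DE while injecting local descent information.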
Abnormal cognition, sleep, EEG and brain metabolism in a novel knock-in Alzheimer mouse, PLB1
Web-based monitoring tools for Resistive Plate Chambers in the CMS experiment at CERN
The Resistive Plate Chambers (RPC) are used in the CMS experiment at the trigger level and also in the standard offline muon reconstruction. In order to guarantee the quality of the collected data and to monitor the detector performance online, a set of tools has been developed in CMS that is heavily used in the RPC system. The Web-Based Monitoring (WBM) is a set of Java servlets that allows users to check the performance of the hardware during data taking, providing distributions and history plots of all the parameters. The functionalities of the RPC WBM monitoring tools are presented, along with studies of the detector performance as a function of growing luminosity and of environmental conditions tracked over time.
Radiation background with the CMS RPCs at the LHC
The Resistive Plate Chambers (RPCs) are employed in the CMS experiment at the LHC as a dedicated trigger system, both in the barrel and in the endcap. This article presents results of the radiation background measurements performed with the 2011 and 2012 proton-proton collision data collected by CMS. Emphasis is given to the measurements of the background distribution inside the RPCs. The expected background rates during future running of the LHC are estimated both from extrapolated measurements and from simulation.
Study of decays to the final state and evidence for the decay
A study of decays is performed for the first time using data corresponding to an integrated luminosity of 3.0 collected by the LHCb experiment in collisions at centre-of-mass energies of and TeV. Evidence for the decay is reported with a significance of 4.0 standard deviations, resulting in the measurement of to be . Here denotes a branching fraction while and are the production cross-sections for and mesons. An indication of weak annihilation is found for the region , with a significance of 2.4 standard deviations.
Comment: All figures and tables, along with any supplementary material and additional information, are available at https://lhcbproject.web.cern.ch/lhcbproject/Publications/LHCbProjectPublic/LHCb-PAPER-2016-022.html; link to supplemental material inserted in the references.
Enhancing quantum efficiency of thin-film silicon solar cells by Pareto optimality
We present a composite design methodology for the simulation and optimization of solar cell performance. Our method is based on the synergy of different computational techniques and is especially designed for thin-film cell technology. In particular, we aim to efficiently simulate light trapping and plasmonic effects to enhance the light harvesting of the cell. The methodology is based on the sequential application of a hierarchy of approaches: (a) full Maxwell simulations are applied to derive the photon's scattering probability in systems presenting textured interfaces; (b) a calibrated Photonic Monte Carlo is used in conjunction with the scattering-matrix method to evaluate coherent and scattered photon absorption in the full cell architectures; (c) the results of these advanced optical simulations are used as the pair-generation terms in a model implemented in an effective Technology Computer-Aided Design (TCAD) tool for the derivation of the cell performance; (d) the models are investigated by qualitative and quantitative sensitivity analysis algorithms, to evaluate the importance of the considered design parameters on the models' output and to get a first-order description of the objective space; (e) sensitivity analysis results are used to guide and simplify the optimization of the model, achieved through both Single-Objective Optimization (to fully maximize device efficiency) and Multi-Objective Optimization (to balance efficiency and cost); (f) the local, global and 'Glocal' robustness of optimal solutions found by the optimization algorithms is statistically evaluated; (g) data-based Identifiability Analysis is used to study the relationships between parameters. The results obtained show a noteworthy improvement with respect to the quantum efficiency of the reference cell, demonstrating that the presented methodology is suitable for effective optimization of solar cell devices.
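Step (e) of the methodology rests on Pareto optimality: a design is kept only if no other design is at least as good in every objective and strictly better in one. As a minimal, hypothetical sketch (the candidate values are invented, not from the paper), assuming efficiency is maximized and cost minimized:

```python
def pareto_front(points):
    """Return the non-dominated subset of (efficiency, cost) tuples.

    A point q dominates p if q's efficiency is >= p's and q's cost is
    <= p's, with q != p (maximize efficiency, minimize cost).
    """
    front = []
    for p in points:
        dominated = any(
            q[0] >= p[0] and q[1] <= p[1] and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (quantum efficiency, normalized cost) candidates
cells = [(0.18, 1.0), (0.20, 1.5), (0.15, 0.8), (0.20, 1.2), (0.17, 1.4)]
front = pareto_front(cells)
```

The multi-objective optimizer returns such a front rather than a single design, leaving the efficiency-versus-cost trade-off to the designer.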
Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV
The performance of muon reconstruction, identification, and triggering in CMS
has been studied using 40 inverse picobarns of data collected in pp collisions
at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection
criteria covering a wide range of physics analysis needs have been examined.
For all considered selections, the efficiency to reconstruct and identify a
muon with a transverse momentum pT larger than a few GeV is above 95% over the
whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4,
while the probability to misidentify a hadron as a muon is well below 1%. The
efficiency to trigger on single muons with pT above a few GeV is higher than
90% over the full eta range, and typically substantially better. The overall
momentum scale is measured to a precision of 0.2% with muons from Z decays. The
transverse momentum resolution varies from 1% to 6% depending on pseudorapidity
for muons with pT below 100 GeV and, using cosmic rays, it is shown to be
better than 10% in the central region up to pT = 1 TeV. Observed distributions
of all quantities are well reproduced by the Monte Carlo simulation.
Comment: Replaced with published version. Added journal reference and DOI.
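The efficiencies quoted above are, at root, ratios of passing to total probe muons. As a simple illustration only (not the tag-and-probe machinery actually used in the CMS measurement; the counts below are invented), an efficiency with a normal-approximation binomial uncertainty can be computed as:

```python
import math

def efficiency(n_pass, n_total):
    """Efficiency and its normal-approximation binomial uncertainty.

    Illustrative estimator only; near eff = 0 or 1 a Clopper-Pearson
    interval would be more appropriate.
    """
    eff = n_pass / n_total
    err = math.sqrt(eff * (1.0 - eff) / n_total)
    return eff, err

# Hypothetical counts: 9730 of 10000 probes pass the selection
eff, err = efficiency(9730, 10000)
```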